[SPARK-6764] Add wheel package support for PySpark #5408
Conversation
Note: I could not modify it myself since I am not familiar with Scala/Java or Spark internals. Basically, I think the following changes are required.
Is there someone who could make these changes?
Can you rebase this PR?
@@ -21,6 +21,8 @@
 from threading import Lock
 from tempfile import NamedTemporaryFile

+from pip.commands.install import InstallCommand as pip_InstallCommand
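As context for what this import is used for, here is a rough, hypothetical sketch of installing a single wheel into a private directory at runtime. It shells out to pip's command-line interface rather than calling the internal InstallCommand class the patch imports, and the helper name is not from the patch:

import subprocess
import sys

def install_wheel(wheel_path, target_dir):
    # Install one .whl into target_dir using the pip bundled with this
    # interpreter; --no-deps assumes dependencies also ship as wheels.
    subprocess.check_call([
        sys.executable, "-m", "pip", "install",
        "--no-deps", "--target", target_dir, wheel_path,
    ])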
This introduces a dependency on pip, which is not available by default. We should put the import in a try block.
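For illustration, a minimal sketch of the guard the reviewer is asking for, assuming the pre-pip-10 module layout that this patch imports from:

# Guarded import: pip is an optional dependency, so degrade gracefully
# when it is missing (or when its internal layout has changed).
try:
    from pip.commands.install import InstallCommand as pip_InstallCommand
except ImportError:
    pip_InstallCommand = None  # wheel installation support is disabled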
@takaomag Could you bring this up-to-date?
Test build #1829 has finished for PR 5408 at commit
I really wish this PR hadn't been abandoned -- I'm going to take a look at finishing the requested work in a separate PR over the holidays.
I'm going to close this pull request. If this is still relevant and you are interested in pushing it forward, please open a new pull request. Thanks!
I am working on a new version. Will propose a PR soon.
- Merge of apache#13599 ("virtualenv in pyspark", bug SPARK-13587) and apache#5408 ("wheel package support for PySpark", bug SPARK-6764)
- Documentation updated
- Only Standalone and YARN supported; Mesos not supported
- Only tested with virtualenv/pip; Conda not tested
- Client deployment + pip install with an index: OK (1 min 30 s execution)
- Client deployment + wheelhouse without an index: KO (cffi refuses the built wheel)

Signed-off-by: Gaetan Semet <[email protected]>
Support wheel packaging for PySpark.
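As a rough illustration of the wheelhouse workflow mentioned in the commit message above (build all dependency wheels up front, then install them offline without an index), here is a sketch that drives pip's command-line interface; the function names are hypothetical and not part of the proposed patch:

import subprocess
import sys

def build_wheelhouse(requirements, wheel_dir="wheelhouse"):
    # Pre-build wheels for every requirement into a local directory.
    subprocess.check_call([
        sys.executable, "-m", "pip", "wheel",
        "-r", requirements, "--wheel-dir", wheel_dir,
    ])

def install_from_wheelhouse(requirements, wheel_dir="wheelhouse"):
    # Install strictly from the local wheelhouse, never contacting an index.
    subprocess.check_call([
        sys.executable, "-m", "pip", "install",
        "--no-index", "--find-links", wheel_dir, "-r", requirements,
    ])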